
    Intelligent wristbands for the automatic detection of emotional states for the elderly

    Over the last few years, research in computational intelligence has been conducted to detect people's emotional states. This paper proposes the use of intelligent wristbands for the automatic detection of emotional states, and develops an application that monitors older people in order to improve their quality of life. The paper describes the hardware design and the cognitive module that allows the recognition of emotional states. The proposed wristband also integrates a camera that improves emotion detection.

    Funding: Programa Operacional Temático Factores de Competitividade (POCI-01-0145-); MINECO/FEDER TIN2015-65515-C4-1-R and the FPI grant AP2013-01276 awarded to Jaime-Andres Rincon. This work is supported by COMPETE: POCI-01-0145-FEDER-007043 and FCT - Fundação para a Ciência e Tecnologia within the projects UID/CEC/00319/2013 and the Post-Doc scholarship SFRH/BPD/102696/201
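
    The abstract does not detail how the wristband and camera outputs are combined; the following is a speculative late-fusion sketch, in which the emotion label set, the weights, and the probabilities are all illustrative assumptions rather than the authors' cognitive module.

```python
# Speculative late-fusion sketch: combine per-modality class probabilities
# from a wristband model and a camera model. Label set and weight are assumed.
import numpy as np

EMOTIONS = ["neutral", "happy", "sad", "angry"]  # assumed label set

def fuse(p_wristband, p_camera, w=0.6):
    """Weighted average of the two modalities' class probabilities."""
    p = w * np.asarray(p_wristband) + (1 - w) * np.asarray(p_camera)
    return EMOTIONS[int(np.argmax(p))]

# Example: the physiological model is unsure, the camera tips the decision.
print(fuse([0.4, 0.3, 0.2, 0.1], [0.1, 0.6, 0.2, 0.1]))  # -> "happy"
```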

    EEG correlates of different emotional states elicited during watching music videos

    Studying emotions has become increasingly popular in various research fields. Researchers across the globe have studied various tools to implicitly assess people's emotions and affective states. Human-computer interface systems in particular can benefit from such an implicit emotion-evaluation module, which can help them determine their users' affective states and act accordingly. Brain electrical activity can be considered an appropriate candidate for extracting emotion-related cues, but research in this direction is still in its infancy. In this paper, the results of analyzing the electroencephalogram (EEG) for assessing emotions elicited while watching various pre-selected emotional music video clips are reported. More precisely, in-depth results of both subject-dependent and subject-independent correlation analyses between time-domain and frequency-domain features of the EEG signal and subjects' self-assessed emotions are presented and discussed
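
    As a concrete illustration of the kind of analysis described, correlating a frequency-domain EEG feature with self-assessed emotion ratings across trials, here is a minimal sketch; the sampling rate, the chosen band, and all data are synthetic placeholders, not the study's recordings.

```python
# Minimal subject-dependent correlation analysis: one band-power feature per
# trial, correlated with self-assessed arousal ratings. All data is synthetic.
import numpy as np
from scipy.signal import welch
from scipy.stats import pearsonr

FS = 128  # sampling rate in Hz (assumed)

def band_power(signal, fs, band):
    """Average power of `signal` within a frequency band via Welch's PSD."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

# Synthetic stand-in: 40 trials of 10 s single-channel EEG plus ratings (1-9).
rng = np.random.default_rng(0)
trials = rng.standard_normal((40, FS * 10))
arousal_ratings = rng.integers(1, 10, size=40)

# One feature per trial: alpha-band (8-13 Hz) power.
alpha_power = np.array([band_power(t, FS, (8, 13)) for t in trials])

# Correlate the feature with the self-assessed ratings across trials.
r, p = pearsonr(alpha_power, arousal_ratings)
print(f"alpha power vs. arousal: r={r:.3f}, p={p:.3f}")
```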

    Extracting Relevance and Affect Information from Physiological Text Annotation

    We present physiological text annotation, which refers to the practice of associating physiological responses with text content in order to infer characteristics of the user's information needs and affective responses. Text annotation is a laborious task, and implicit feedback has been studied as a way to collect annotations without requiring any explicit action from the user. Previous work has explored behavioral signals, such as clicks or dwell time, to automatically infer annotations, while physiological signals have mostly been explored for image or video content. We report on two experiments in which physiological text annotation is studied, first to 1) indicate perceived relevance and then to 2) indicate affective responses of the users. The first experiment tackles the user's perception of the relevance of an information item, which is fundamental to revealing the user's information needs. The second experiment is then aimed at revealing the user's affective responses towards a relevant text document. Results show that physiological user signals are associated with relevance and affect. In particular, electrodermal activity (EDA) was found to be different when users read relevant content than when they read irrelevant content, and was found to be lower when reading texts with negative emotional content than when reading texts with neutral content. Together, the experiments show that physiological text annotation can provide valuable implicit inputs for personalized systems. We discuss how our findings help design personalized systems that can annotate digital content using human physiology without the need for any explicit user interaction
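
    A hedged sketch of the group comparison the abstract reports (EDA differing between relevant and irrelevant reading episodes) follows; the per-episode EDA summaries are synthetic, and the study's actual features and statistical tests may differ.

```python
# Compare a per-episode EDA summary between relevant and irrelevant texts with
# an independent-samples t-test. All values below are synthetic placeholders.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(1)
# Mean skin-conductance level (microsiemens) per reading episode (synthetic).
eda_relevant = rng.normal(loc=6.2, scale=1.0, size=30)
eda_irrelevant = rng.normal(loc=5.5, scale=1.0, size=30)

t, p = ttest_ind(eda_relevant, eda_irrelevant)
print(f"relevant vs. irrelevant EDA: t={t:.2f}, p={p:.4f}")
```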

    Age-related craniofacial differences based on spatio-temporal face image atlases

    A number of studies have recently explored associations between craniofacial differences and genetics. Most of these works have been based on spatial face image models adjusted to counter the effects of age. This approach provides a limited understanding of normal and abnormal craniofacial development owing to the lack of age-progression information. Here, the authors propose and implement an imaging framework that combines facial landmark positioning, non-rigid registration, novel age-dependent face modelling and common distance metrics to disclose the facial differences that vary over time with the subjects' age. All the experiments carried out and the corresponding results presented here are based on a database comprising ordinary two-dimensional (2D) frontal face images of Down Syndrome (DS) and control sample groups. A number of craniofacial metrics have been successfully identified that highlight statistically significant and clinically relevant differences between the controls and the faces associated with DS within the age range from 1 to 18 years, producing realistic unbiased face models with a similar level of detail at all age intervals, despite the small sample size available
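
    One step of such a pipeline, computing a craniofacial distance metric from 2D landmarks and comparing it between groups within an age interval, might look like the sketch below; the landmark convention, the chosen metric, and the data are illustrative assumptions, not the authors' atlas.

```python
# Compute an inter-landmark distance per face and compare two groups within a
# single age interval. Landmark indices follow the common 68-point convention;
# all face data is synthetic.
import numpy as np
from scipy.stats import mannwhitneyu

def interlandmark_distance(landmarks, i, j):
    """Euclidean distance between landmarks i and j of one face (68 x 2 array)."""
    return np.linalg.norm(landmarks[i] - landmarks[j])

rng = np.random.default_rng(2)
# Synthetic stand-ins: 20 faces per group, 68 (x, y) landmarks each.
controls = rng.normal(size=(20, 68, 2))
cases = rng.normal(loc=0.1, size=(20, 68, 2))

# Hypothetical metric: distance between landmarks 36 and 45 (outer eye
# corners in the 68-point convention), one value per face.
d_controls = [interlandmark_distance(f, 36, 45) for f in controls]
d_cases = [interlandmark_distance(f, 36, 45) for f in cases]

# Nonparametric group comparison within one age interval.
u, p = mannwhitneyu(d_controls, d_cases)
print(f"outer-eye-corner distance, controls vs. cases: U={u:.1f}, p={p:.3f}")
```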

    Affective recognition from EEG signals: an integrated data-mining approach

    Emotions play an important role in human communication, interaction, and decision-making processes. Considerable efforts have therefore been made towards the automatic identification of human emotions; in particular, electroencephalogram (EEG) signals and Data Mining (DM) techniques have been used to create models that recognize the affective states of users. However, most previous works have used clinical-grade EEG systems with at least 32 electrodes. These systems are expensive and cumbersome, and therefore unsuitable for use during normal daily activities. Smaller EEG headsets such as the Emotiv are now available and can be used during daily activities. This paper investigates the accuracy and applicability of previous affective recognition methods on data collected with an Emotiv headset while participants used a personal computer to fulfill several tasks. Several features were extracted from four channels only (AF3, AF4, F3 and F4 in accordance with the 10–20 system). Both Support Vector Machine and Naïve Bayes classifiers were used for emotion classification. Results demonstrate that such methods can be used to accurately detect emotions using a small EEG headset during a normal daily activity
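
    A minimal sketch of the classification setup described (features from the four frontal channels fed to a Support Vector Machine and a Naïve Bayes classifier) follows; the 20-dimensional feature vectors and labels are synthetic stand-ins, not the collected Emotiv data.

```python
# Cross-validated SVM and Gaussian Naive Bayes on per-window feature vectors,
# e.g. 5 features per channel for AF3, AF4, F3, F4. All data is synthetic.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(3)
X = rng.standard_normal((200, 20))      # 200 windows x 20 features (assumed)
y = rng.integers(0, 2, size=200)        # binary affective-state labels

for name, clf in [("SVM", make_pipeline(StandardScaler(), SVC(kernel="rbf"))),
                  ("Naive Bayes", GaussianNB())]:
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.2f}")
```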

    Towards emotion recognition for virtual environments: an evaluation of EEG features on benchmark dataset

    One of the challenges in virtual environments is the difficulty users have in interacting with these increasingly complex systems. Ultimately, endowing machines with the ability to perceive users' emotions will enable a more intuitive and reliable interaction. Consequently, using the electroencephalogram as a bio-signal sensor, the affective state of a user can be modelled and subsequently utilised in order to achieve a system that can recognise and react to the user's emotions. This paper investigates features extracted from electroencephalogram signals for the purpose of affective state modelling based on Russell's Circumplex Model. Investigations are presented that aim to provide the foundation for future work in modelling user affect to enhance the interaction experience in virtual environments. The DEAP dataset was used within this work, along with a Support Vector Machine and a Random Forest, which yielded reasonable classification accuracies for Valence and Arousal using feature vectors based on statistical measurements, band power from the α and β waves, and Higher Order Crossings of the EEG signal
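
    A minimal sketch of the evaluated feature/classifier combination (band power plus Higher Order Crossings, classified with an SVM and a Random Forest) is given below; the signals and valence labels are synthetic placeholders, not DEAP trials, and the feature details are assumptions.

```python
# Band-power and Higher Order Crossings (HOC) features, classified with an SVM
# and a Random Forest under 5-fold cross-validation. All data is synthetic.
import numpy as np
from scipy.signal import welch
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

FS = 128  # DEAP's preprocessed sampling rate

def band_power(x, band):
    freqs, psd = welch(x, fs=FS, nperseg=FS * 2)
    return psd[(freqs >= band[0]) & (freqs <= band[1])].mean()

def hoc(x, order=5):
    """Higher Order Crossings: zero-crossing counts of successive differences."""
    counts, z = [], x - x.mean()
    for _ in range(order):
        signs = np.signbit(z).astype(np.int8)
        counts.append(int(np.sum(np.diff(signs) != 0)))
        z = np.diff(z)
    return counts

rng = np.random.default_rng(4)
trials = rng.standard_normal((40, FS * 60))   # 40 one-minute trials (synthetic)
valence = rng.integers(0, 2, size=40)         # high/low valence labels

# One feature vector per trial: alpha and beta band power plus HOC counts.
X = np.array([[band_power(t, (8, 13)), band_power(t, (13, 30))] + hoc(t)
              for t in trials])

for name, clf in [("SVM", SVC()), ("Random Forest", RandomForestClassifier())]:
    scores = cross_val_score(clf, X, valence, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.2f}")
```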